Investigating the 2022 Drought Effects on NDWI within Leicestershire¶

Main Objectives:¶

  1. Acquire Monthly Sentinel-2 Images from April to October 2022
  2. Calculate NDWI for each image composite
  3. Produce a series of NDWI Maps
  4. Produce a map showcasing where the drought impacts were most severe
  5. Calculate monthly NDWI stats for areas of interest (e.g. agricultural field, pasture, and urban area)

Subsidiary Objectives:¶

  1. Make a movie of the NDWI images over the period of interest
  2. Plot Mean / Median NDWI Time Series
  3. Plot NDWI band frequency (histograms) for further investigation of drought effects
  4. Additionally, for zonal stats, investigate a water body area of interest

Initial Setup¶

’’’¶

The following blocks of code are modified from: Balzter, H. (2023) practical_week33_zonal_statistics.ipynb (17 January 2023) https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

In [2]:
# Mount Google Drive 
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
In [3]:
# install some libraries that are not on Colab by default
!pip install rasterio
!pip install geopandas
!pip install rasterstats
!pip install earthengine-api
!pip install requests
!pip install sentinelsat
!pip install  contextily



# import libraries
import json
import geopandas as gpd
import contextily as cx
import matplotlib.pyplot as plt
import math
import numpy as np
from osgeo import gdal, ogr
import os
from os import listdir
from os.path import isfile, isdir, join
import pandas as pd
from pprint import pprint
import rasterio
from rasterio import plot
from rasterio.plot import show_hist
from scipy import optimize
import shutil
import sys
import zipfile
import requests
import io
import webbrowser
import ee
import pickle
from rasterstats import zonal_stats
import warnings
warnings.simplefilter(action='ignore', category=FutureWarning)

# define the root directory where our data are stored - adjust this to your directory names
rootdir = '/content/drive/MyDrive/GY7709-2022-23' # this is where pygge.py needs to be saved
# make sure that this path points to the location of the pygge module on your Google Drive
if rootdir not in sys.path:
  sys.path.append(rootdir)
# import the pygge library of functions for this module
import pygge

%matplotlib inline
Successfully installed rasterio-1.3.6, geopandas-0.13.0, rasterstats-0.18.0, sentinelsat-1.2.1 and contextily-1.3.0 (full pip installation logs omitted; earthengine-api and requests were already present on Colab)
In [4]:
# Set up directories 
print("Connected to data directory: " + rootdir)

# path to the directory where we saved the downloaded Sentinel-2 images last time
datadir = join(rootdir, "download")
print("Downloaded image data are in: " + datadir)

# path to your temporary drive on the Colab Virtual Machine
cd = "/content/work"

# directory for downloading the Sentinel-2 granules
downloaddir = join(cd, 'leicester') # where we save the full Leicestershire images
quickdir = join(cd, 'aoi_leicester')  # where we save the smaller Leicestershire AOI
outdir = join(cd, 'out') # where we save any other outputs

# CAREFUL: This code removes the named directories and everything inside them to free up space
try:
  shutil.rmtree(downloaddir)
except FileNotFoundError:
  print(downloaddir + " not found.")

try:
  shutil.rmtree(quickdir)
except FileNotFoundError:
  print(quickdir + " not found.")

try:
  shutil.rmtree(outdir)
except FileNotFoundError:
  print(outdir + " not found.")

# create the new directories, unless they already exist
os.makedirs(cd, exist_ok=True)
os.makedirs(downloaddir, exist_ok=True)
os.makedirs(quickdir, exist_ok=True)
os.makedirs(outdir, exist_ok=True)
Connected to data directory: /content/drive/MyDrive/GY7709-2022-23
Downloaded image data are in: /content/drive/MyDrive/GY7709-2022-23/download
/content/work/leicester not found.
/content/work/aoi_leicester not found.
/content/work/out not found.
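The three try/except blocks above repeat the same remove-then-recreate pattern. Because shutil.rmtree accepts an ignore_errors flag, the pattern can be folded into one small helper; reset_dir below is a hypothetical name of ours, not part of the original notebook:

```python
import os
import shutil

def reset_dir(path):
  """Remove a directory tree if it exists, then recreate it empty."""
  shutil.rmtree(path, ignore_errors=True)  # no error raised if path does not exist
  os.makedirs(path, exist_ok=True)
```

Each of the three working directories could then be reset with a single call, e.g. reset_dir(outdir), at the cost of losing the "not found" diagnostic messages printed above.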
In [5]:
# Connect to Google Earth Engine API
!earthengine authenticate

ee.Initialize()
To authorize access needed by Earth Engine, open the following URL in a web browser and follow the instructions. If the web browser does not start automatically, please manually browse the URL below.

    https://code.earthengine.google.com/client-auth?scopes=https%3A//www.googleapis.com/auth/earthengine%20https%3A//www.googleapis.com/auth/devstorage.full_control&request_id=AcAsorwlBPbzvAA9ShvS0SsDbued2R37d4bzJkuyPXQ&tc=Q7eQN7DVwtR8y0W0rSUqN7SzvvTAWVLSfLUWNoNOG5w&cc=_lN9twZU5yNTv9M0UqSFnzPhX6x1LRayNZEuza-38vU

The authorization workflow will generate a code, which you should paste in the box below.
Enter verification code: 4/1AbUR2VOszOWT7lM-EqLQArE3RuPh7PeKBiqxgfbhbG1IqeSAX5-dvz9TyCU

Successfully saved authorization token.
In [6]:
# define the GEE Sentinel-2 collection
# using S2_HARMONIZED because GEE has already corrected the radiometric offset
# see https://developers.google.com/earth-engine/datasets/catalog/COPERNICUS_S2_HARMONIZED for more information
s2collection = ('COPERNICUS/S2_HARMONIZED')

Important Functions Used Throughout the Code¶

Calculating NDWI¶

’’’¶

The following block of code is modified from: Balzter, H. (2023) PYGGE https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

In [7]:
# take in a geotiff file, calculate NDWI, output a geotiff (modified from pygge)
def calc_ndwi(monthly_file, outfile):
  '''
  Calculates NDWI from a geotiff file and saves it as a Geotiff raster.

  Args:
    monthly_file = string with directory path and filename of the raster file
    outfile = string of the output filename and directory path for the NDWI raster file
  '''

  # open file
  monthly_file_op = rasterio.open(monthly_file, 'r') 

  # load the data from the green and NIR bands
  band_green = monthly_file_op.read(2)
  band_nir = monthly_file_op.read(1)

  # The Sentinel-2 bands are delivered as uint16 data type (unsigned integer 16 bits per pixel).
  # This means that we cannot do floating point calculations on them without first converting them to float.
  # Convert the band arrays to float:
  band_green = np.float32(band_green)
  band_nir = np.float32(band_nir)

  # Calculate the Water index. This is done pixel by pixel using the NumPy masked array arithmetic.
  
  # We need to handle exceptions to the calculation. Where the sum of the two bands
  # in the denominator is zero (NIR+Green), the NDWI formula would give an error otherwise.
  # We do this by setting the NumPy error state to 'ignore' for this calculation only:
  # https://numpy.org/doc/stable/reference/generated/numpy.errstate.html
  
  with np.errstate(divide='ignore', invalid='ignore'): # this only applies to the following indented lines of code
    # NDWI formula:
    ndwi = np.divide(np.subtract(band_nir, band_green), np.add(band_nir, band_green)) # ignore divide-by-zero and 0/0 warnings here
    ndwi[(band_nir + band_green) == 0] = 0 # where NIR + Green is zero, set the NDWI to zero

  # save the NDWI image
  # open outfile with the same georeferencing as the input file
  outfile = rasterio.open(outfile, 'w', driver="GTiff", width=monthly_file_op.width, 
                          height=monthly_file_op.height, count=1, crs=monthly_file_op.crs, 
                          transform=monthly_file_op.transform, dtype=np.float32)
  outfile.write(ndwi, 1)
  outfile.close()
  monthly_file_op.close()

  return
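For a single pixel, the index computed by calc_ndwi reduces to a simple band ratio. This pure-Python sketch mirrors the same formula and zero-denominator convention without the NumPy machinery; the function name ndwi_pixel is ours, not the notebook's:

```python
def ndwi_pixel(green, nir):
  """Water index for one pixel, mirroring the formula used in calc_ndwi above."""
  denominator = nir + green
  if denominator == 0:
    return 0.0  # same convention as the raster version: zero where NIR + Green is zero
  return (nir - green) / denominator
```

With this convention a NIR reflectance well above the green reflectance yields a positive value, and swapping the two bands flips the sign.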

Calculating Differences in NDWI¶

In [8]:
# take in two geotiff NDWI files, calculate their difference, and output a geotiff (modified from pygge)
def calc_ndwi_diff(start_file, end_file, outfile):
  '''
  Calculates NDWI difference from two files and saves it as a Geotiff raster.

  Args:
    start_file = string with directory path and filename of the raster file we subtract from
    end_file = string with directory path and filename of the raster file that is subtracted from start_file
    outfile = string of the output filename and directory path for the NDWI difference raster file
  '''

  # open and read files
  start_file = rasterio.open(start_file, 'r') 
  start_ndwi = start_file.read(1)
  
  end_file = rasterio.open(end_file, 'r') 
  end_ndwi = end_file.read(1)

  # calculate diff
  ndwi_diff = np.subtract(start_ndwi,end_ndwi) 
    
  # save the NDWI difference image
  outfile = rasterio.open(outfile, 'w', driver="GTiff", width=start_file.width, 
                          height=start_file.height, count=1, crs=start_file.crs, 
                          transform=start_file.transform, dtype=np.float32)
  outfile.write(ndwi_diff, 1)
  outfile.close()
  start_file.close()
  end_file.close()

  return

Calculating Zonal Stats¶

’’’¶

The following block of code originates from: Balzter, H. (2022) pygge (22 December 2022) https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

In [9]:
# code from pygge
def easy_zonal_stats(rasterfiles, shapefile, statisticsfile, nodata=0, 
                     stats="mean std"):
  '''
  Extracts zonal statistics using rasterstats from a list of raster files and saves
    a single statisticsfile in .csv format as output.

  Args:
    rasterfiles = list of strings with raster file names with full directory paths
    shapefile = string giving the path and file name of the shapefile
    statisticsfile = string with the name of the output .csv file
    nodata (optional) = nodata value to be ignored in the calculation
    stats (optional) = string defining which statistics are calculated

  Returns: a Pandas dataframe with the statistics results saved into the statisticsfile
  '''

  # iterate over all raster files and extract zonal statistics
  for x in range(len(rasterfiles)):
    f = rasterfiles[x]
    # extract the string between the last slash and the dot in the file name and directory path as scene ID
    scene_id = f.split("/")[-1].split(".")[0]

    # get zonal statistics for all polygons in the shapefile
    # the result is a list of dictionaries
    statistics = zonal_stats(shapefile, f, nodata=nodata, stats=stats)

    # number of polygons in the shapefile (one output row per polygon per raster file)
    n = len(statistics)

    if x == 0:
      # create the pandas dataframe in the first iteration with column names only
      df = pd.DataFrame(columns=['scene_id'] + stats.split(" "))

    # add the scene ID to each row in the statistics output from this scene (image)
    for row in range(n):
      statistics[row].update({"scene_id": scene_id})
    
    # append the rows for this scene to the dataframe (df.append is deprecated, so use pd.concat)
    df = pd.concat([df, pd.DataFrame(statistics)], ignore_index=True)

  # After the loop, write the statistics dataframe to a .csv file (overwrite if exists)
  df.to_csv(statisticsfile, index=False)

  return(df)
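The scene ID extraction inside easy_zonal_stats takes the text between the last slash and the first dot of the file name. Pulled out as a standalone helper (scene_id_from_path is our illustrative name, not from pygge), the logic is:

```python
def scene_id_from_path(path):
  """Extract the text between the last slash and the first dot of the file name."""
  return path.split("/")[-1].split(".")[0]
```

Splitting on "/" first means dots in parent directory names cannot leak into the scene ID.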

Acquiring Data for all of Leicestershire¶

Code for Search Parameters for Image Acquisition¶

’’’¶

The following code blocks originate from: Balzter, H. (2023) practical_week33_zonal_statistics.ipynb (17 January 2023) https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

In [10]:
# shapefile for all of leicestershire
shapefile = join(rootdir, 'leicester', 'leicestershire.shp') # ESRI Shapefile of the study area

# check whether the shapefile exists
if os.path.exists(shapefile):
  print('Shapefile found: '+shapefile)
else:
  print('ERROR: Shapefile not found: '+shapefile)
  print('Upload a shapefile to your Google Drive directory: '+ rootdir)

# Get the shapefile layer's extent, CRS and EPSG code
extent, outSpatialRef, epsg = pygge.get_shp_extent(shapefile)
print("Extent of the area of interest (shapefile):\n", extent)
print(type(extent))
print("\nCoordinate referencing system (CRS) of the shapefile:\n", outSpatialRef)
print('EPSG code: ', epsg)
Shapefile found: /content/drive/MyDrive/GY7709-2022-23/leicester/leicestershire.shp
Extent of the area of interest (shapefile):
 (-1.5975412373, -0.6641016341, 52.3921710451, 52.9776790938)
<class 'tuple'>

Coordinate referencing system (CRS) of the shapefile:
 GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9122"]],
    AXIS["Latitude",NORTH],
    AXIS["Longitude",EAST],
    AUTHORITY["EPSG","4326"]]
EPSG code:  4326
In [11]:
# GEE needs a special format for defining an area of interest. 
# It has to be a GeoJSON Polygon and the coordinates should be first defined in a list and then converted using ee.Geometry. 
extent_list = list(extent)
print(extent_list)
print(type(extent_list))
# close the list of polygon coordinates by adding the starting node at the end again
# and make list elements in the form of coordinate pairs (x, y), i.e. (longitude, latitude)
area_list = list([(extent[0], extent[2]),(extent[1], extent[2]),(extent[1], extent[3]),(extent[0], extent[3]),(extent[0], extent[2])])
print(area_list)
print(type(area_list))

search_area = ee.Geometry.Polygon(area_list)
print(search_area)
print(type(search_area))
[-1.5975412373, -0.6641016341, 52.3921710451, 52.9776790938]
<class 'list'>
[(-1.5975412373, 52.3921710451), (-0.6641016341, 52.3921710451), (-0.6641016341, 52.9776790938), (-1.5975412373, 52.9776790938), (-1.5975412373, 52.3921710451)]
<class 'list'>
ee.Geometry({
  "functionInvocationValue": {
    "functionName": "GeometryConstructors.Polygon",
    "arguments": {
      "coordinates": {
        "constantValue": [
          [
            [
              -1.5975412373,
              52.3921710451
            ],
            [
              -0.6641016341,
              52.3921710451
            ],
            [
              -0.6641016341,
              52.9776790938
            ],
            [
              -1.5975412373,
              52.9776790938
            ],
            [
              -1.5975412373,
              52.3921710451
            ]
          ]
        ]
      },
      "evenOdd": {
        "constantValue": true
      }
    }
  }
})
<class 'ee.geometry.Geometry'>
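The hand-written list construction above builds a closed ring from the GDAL-style extent tuple (xmin, xmax, ymin, ymax). The same ring can be produced by a small helper, which makes the corner ordering and the repeated starting node explicit (extent_to_ring is our hypothetical name, not part of the notebook):

```python
def extent_to_ring(extent):
  """Build a closed (x, y) coordinate ring from a (xmin, xmax, ymin, ymax) extent."""
  xmin, xmax, ymin, ymax = extent
  # corners anticlockwise from the lower-left, repeating the first node to close the ring
  return [(xmin, ymin), (xmax, ymin), (xmax, ymax), (xmin, ymax), (xmin, ymin)]
```

The result could then be passed straight to ee.Geometry.Polygon in place of area_list.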

Map of Study Area¶

In [12]:
# loading shapefile via geopandas
study_area = gpd.read_file(shapefile)
# plot format
fig, ax = plt.subplots(figsize=(10,10))
ax.set_title("Leicestershire, UK")

# plot study area 
# OSM epsg is 3857
study_area.to_crs(epsg=3857).plot(ax=ax, facecolor='none', edgecolor='red', linewidth=3)
# add OSM basemap
cx.add_basemap(ax, source=cx.providers.OpenStreetMap.Mapnik)

# show plot
plt.show()

Image Acquisition¶

’’’¶

The following blocks of code are modified from: Balzter, H. (2023) practical_week30_Sentinel-2_GEE.ipynb (17 January 2023) https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

In [13]:
# Obtain monthly image composites

# change directory to download directory
os.chdir(downloaddir)

# make a list of lists with all date ranges for our new searches
months = [['2022-04-01', '2022-04-30'],
          ['2022-05-01', '2022-05-31'],
          ['2022-06-01', '2022-06-30'],
          ['2022-07-01', '2022-07-31'],
          ['2022-08-01', '2022-08-31'],
          ['2022-09-01', '2022-09-30'],
          ['2022-10-01', '2022-10-31']]

# set cloud cover threshold
clouds = 10

# band names for download, a list of strings
# only download the green (B3) and NIR (B8) bands needed for NDWI
bands = ['B3', 'B8']

# spatial resolution of the downloaded data
# we could not go much finer than this over the full county extent
resolution = 50 # in units of metres

# iterate over the months
for month in range(len(months)):
  time_range = months[month]
  print(time_range)

  # do the search on Google Earth Engine

  # image
  s2median = pygge.obtain_image_sentinel(s2collection, time_range, search_area, clouds)

  # print out the band names of the image composite that was returned by our search
  band_names = s2median.bandNames().getInfo()

  # check whether the search returned any imagery
  if len(band_names) == 0:
    print("Search returned no results.")

  else:
    # print all band names  
    print(band_names)

    # file name will start with the year and month 
    date = time_range[0].split("-")
    file_id = date[0] + "-" + date[1]
    
    search_region = pygge.get_region(search_area)
    s2url = pygge.get_url(file_id + "_s2", s2median.select(bands), resolution, search_region, filePerBand=False)
    print(s2url)

    # download the file
    f = pygge.requests.get(s2url, stream=True)

    # check whether it is a valid zip file
    check = zipfile.is_zipfile(io.BytesIO(f.content))

    # retry the download until a valid zip file is received, then unzip it
    # (the while/else runs the else block once the check passes)
    while not check:
        f = pygge.requests.get(s2url, stream=True)
        check = zipfile.is_zipfile(io.BytesIO(f.content))
    else:
        z = zipfile.ZipFile(io.BytesIO(f.content))
        z.extractall()
['2022-04-01', '2022-04-30']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/3afb2820bfddfa38d42e96c6e9b29a08-8fc7c534a0887dffe22b02bc180d781f:getPixels
['2022-05-01', '2022-05-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/9a54a87d5dd6597fefd06bea64f1ee88-fc941e8c63a2066c9727a2e6bbd9b634:getPixels
['2022-06-01', '2022-06-30']
Search returned no results.
['2022-07-01', '2022-07-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/b55c76d319d1266ac05ba09f722f322d-d922f269e779bcca30c2be258eb86557:getPixels
['2022-08-01', '2022-08-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/ca614d18dadadcf77af7608fd8f71a83-1e91eaebea231fc7deb7e93d5574f37b:getPixels
['2022-09-01', '2022-09-30']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/a6339f41209cd4ed884a86357179f320-4e07535b25d1d53df8cce4d58ae88202:getPixels
['2022-10-01', '2022-10-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/50ad0153012a6ed37ab3ec24c97b66f6-5eff8ae8bc426b746a5d0194de540c2b:getPixels
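The hard-coded `months` list above can also be generated programmatically. A minimal sketch using only the standard library: `calendar.monthrange` returns the last day of each month, so 30- and 31-day months are handled automatically (`month_ranges` is a hypothetical helper):

```python
import calendar

def month_ranges(year, first_month, last_month):
    """Build [start, end] ISO-date pairs for each month in the span."""
    ranges = []
    for m in range(first_month, last_month + 1):
        last_day = calendar.monthrange(year, m)[1]  # e.g. 30 for April 2022
        ranges.append([f"{year}-{m:02d}-01", f"{year}-{m:02d}-{last_day:02d}"])
    return ranges

months = month_ranges(2022, 4, 10)  # April to October 2022
```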

Calculate NDWI for Each Monthly Image Composite¶

Using the calc_ndwi function, we will now calculate NDWI for each image composite.

In [14]:
# get list of all tiff files in the directory
allfiles = [f for f in listdir(downloaddir) if isfile(join(downloaddir, f))]
print("all files in leicester directory: ")
for f in sorted(allfiles):
    print(f)
all files in leicester directory: 
2022-04_s2.tif
2022-05_s2.tif
2022-07_s2.tif
2022-08_s2.tif
2022-09_s2.tif
2022-10_s2.tif
In [15]:
# list to remember the directory paths of the newly
# calculated NDWI files 
ndwifiles = []

# iterate over all Sentinel-2 images and calculate NDWI
for file in allfiles:
  # create an output file name
  file_name = file.split("_")[0]
  ndwifile = join(downloaddir, file_name + "_ndwi.tif")
  # remember the file name we created
  ndwifiles.append(ndwifile)
  # calculate the NDWI 
  calc_ndwi(file, ndwifile)

print("List of all NDWI image files:")
for i in ndwifiles:
  print(i)
WARNING:rasterio._env:CPLE_AppDefined in 2022-04_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
<ipython-input-7-1c44162413a9>:33: RuntimeWarning: invalid value encountered in true_divide
  ndwi = np.divide(np.subtract(band_nir,band_green), np.add(band_nir, band_green)) # ignore division by zero errors here
WARNING:rasterio._env:CPLE_AppDefined in 2022-05_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2022-07_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2022-10_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2022-08_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2022-09_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
List of all NDWI image files:
/content/work/leicester/2022-04_ndwi.tif
/content/work/leicester/2022-05_ndwi.tif
/content/work/leicester/2022-07_ndwi.tif
/content/work/leicester/2022-10_ndwi.tif
/content/work/leicester/2022-08_ndwi.tif
/content/work/leicester/2022-09_ndwi.tif
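The `invalid value encountered in true_divide` warning above comes from pixels where the band sum is zero. A minimal sketch of the band ratio shown in the traceback, `(NIR - green) / (NIR + green)`, with the warning suppressed explicitly via `np.errstate` (`safe_band_ratio` is a hypothetical helper, not the notebook's `calc_ndwi`):

```python
import numpy as np

def safe_band_ratio(band_nir, band_green):
    """Compute (NIR - G) / (NIR + G) while suppressing divide warnings;
    zero-sum pixels come out as NaN and can be masked later."""
    band_nir = band_nir.astype("float32")
    band_green = band_green.astype("float32")
    with np.errstate(invalid="ignore", divide="ignore"):
        return (band_nir - band_green) / (band_nir + band_green)

# one zero-sum pixel, one water-like pixel, one vegetation-like pixel
ratio = safe_band_ratio(np.array([0, 3000, 1000]),
                        np.array([0, 1000, 3000]))
```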

Plotting Images¶

In [16]:
# creating format for figures 
# 2 rows and 3 columns for figures 
fig, ax = plt.subplots(2,3, figsize=(10,6))
fig.patch.set_facecolor('white')
ax = np.reshape(ax, 2*3)

# plot maps 
for i, file in enumerate(sorted(ndwifiles)):
    # create a title from the file name 
    title = (file.split("/")[-1]).split(".")[0]
    pygge.easy_plot(file, ax[i], bands=[1], percentiles=[0,99], cmap='Blues', title=title)

Because half of these images are heavily affected by cloud cover and the spatial resolution is coarse, we will instead work with a smaller area of interest within Leicestershire.

Smaller Area of Interest within Leicestershire¶

Redefine Search Parameters¶

’’’¶

The following blocks of code originate from: Balzter, H. (2023) practical_week33_zonal_statistics.ipynb (17 January 2023) https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

In [17]:
# Set new shapefile 
shapefile_new = join(rootdir, 'leicester', 'new_study_area.shp') # ESRI Shapefile of the study area

# check whether the shapefile exists
if os.path.exists(shapefile_new):
  print('Shapefile found: '+shapefile_new)
else:
  print('ERROR: Shapefile not found: '+shapefile_new)
  print('Upload a shapefile to your Google Drive directory: '+ rootdir)
Shapefile found: /content/drive/MyDrive/GY7709-2022-23/leicester/new_study_area.shp
In [18]:
# Get the shapefile layer's extent, CRS and EPSG code
extent_new, outSpatialRef_new, epsg_new = pygge.get_shp_extent(shapefile_new)
print("Extent of the area of interest (shapefile):\n", extent_new)
print(type(extent_new))
print("\nCoordinate referencing system (CRS) of the shapefile:\n", outSpatialRef_new)
print('EPSG code: ', epsg_new)
Extent of the area of interest (shapefile):
 (-1.596963771229389, -1.3720196731836163, 52.522504895477326, 52.84801365490603)
<class 'tuple'>

Coordinate referencing system (CRS) of the shapefile:
 GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9122"]],
    AXIS["Latitude",NORTH],
    AXIS["Longitude",EAST],
    AUTHORITY["EPSG","4326"]]
EPSG code:  4326
In [19]:
# GEE needs a special format for defining an area of interest. 
# It has to be a GeoJSON Polygon and the coordinates should be first defined in a list and then converted using ee.Geometry. 
extent_list = list(extent_new)
print(extent_list)
print(type(extent_list))

# close the list of polygon coordinates by adding the starting node at the end again
# and make list elements in the form of coordinate pairs (x, y), i.e. (longitude, latitude)
area_list = list([(extent_new[0], extent_new[2]),(extent_new[1], extent_new[2]),(extent_new[1], extent_new[3]),(extent_new[0], extent_new[3]),(extent_new[0], extent_new[2])])
print(area_list)
print(type(area_list))

search_area_new = ee.Geometry.Polygon(area_list)
print(search_area_new)
print(type(search_area_new))
[-1.596963771229389, -1.3720196731836163, 52.522504895477326, 52.84801365490603]
<class 'list'>
[(-1.596963771229389, 52.522504895477326), (-1.3720196731836163, 52.522504895477326), (-1.3720196731836163, 52.84801365490603), (-1.596963771229389, 52.84801365490603), (-1.596963771229389, 52.522504895477326)]
<class 'list'>
ee.Geometry({
  "functionInvocationValue": {
    "functionName": "GeometryConstructors.Polygon",
    "arguments": {
      "coordinates": {
        "constantValue": [
          [
            [
              -1.596963771229389,
              52.522504895477326
            ],
            [
              -1.3720196731836163,
              52.522504895477326
            ],
            [
              -1.3720196731836163,
              52.84801365490603
            ],
            [
              -1.596963771229389,
              52.84801365490603
            ],
            [
              -1.596963771229389,
              52.522504895477326
            ]
          ]
        ]
      },
      "evenOdd": {
        "constantValue": true
      }
    }
  }
})
<class 'ee.geometry.Geometry'>

Map of Smaller Study Area¶

In [20]:
# loading shapefile via geopandas
smaller_study_area = gpd.read_file(shapefile_new)
# plot format
fig, ax = plt.subplots(figsize=(10,10))
ax.set_title("Smaller AOI in Leicestershire, UK")

# plot study area 
# OSM epsg is 3857
smaller_study_area.to_crs(epsg=3857).plot(ax=ax, facecolor='none', edgecolor='red', linewidth=5)
# add OSM basemap
cx.add_basemap(ax, source=cx.providers.OpenStreetMap.Mapnik)

# show plot
plt.show()

Image Composite Acquisition¶

’’’¶

The following blocks of code are modified from: Balzter, H. (2023) practical_week30_Sentinel-2_GEE.ipynb (17 January 2023) https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

In [21]:
# Obtain monthly image composites

# change directory to download directory
os.chdir(quickdir)

# make a list of lists with all date ranges for our new searches
# additionally acquire 2021-07 as a baseline for assessing drought severity
months = [['2021-07-01', '2021-07-31'],
          ['2022-04-01', '2022-04-30'],
          ['2022-05-01', '2022-05-31'],
          ['2022-06-01', '2022-06-30'],
          ['2022-07-01', '2022-07-31'],
          ['2022-08-01', '2022-08-31'],
          ['2022-09-01', '2022-09-30'],
          ['2022-10-01', '2022-10-31']]

# set cloud cover threshold
clouds = 10

# band names for download, a list of strings
# only download the green (B3) and NIR (B8) bands needed for NDWI
bands = ['B3', 'B8']

# spatial resolution of the downloaded data
# the finest resolution the download would allow for this smaller area
resolution = 15 # in units of metres

# iterate over the months
for month in range(len(months)):
  time_range = months[month]
  print(time_range)

  # do the search on Google Earth Engine

  # image search on GEE
  s2median = pygge.obtain_image_sentinel(s2collection, time_range, search_area_new, clouds)

  # print out the band names of the image composite that was returned by our search
  band_names = s2median.bandNames().getInfo()

  # check whether the search returned any imagery
  if len(band_names) == 0:
    print("Search returned no results.")

  else:
    # print all band names  
    print(band_names)

    # begin the file name with year and month 
    date = time_range[0].split("-")
    file_id = date[0] + "-" + date[1]

    search_region = pygge.get_region(search_area_new)
    s2url = pygge.get_url(file_id + "_s2", s2median.select(bands), resolution, search_region, filePerBand=False)
    print(s2url)

    # download the file
    f = pygge.requests.get(s2url, stream=True)

    # check whether it is a valid zip file
    check = zipfile.is_zipfile(io.BytesIO(f.content))

    # retry the download until a valid zip file is received, then unzip it
    # (the while/else runs the else block once the check passes)
    while not check:
        f = pygge.requests.get(s2url, stream=True)
        check = zipfile.is_zipfile(io.BytesIO(f.content))
    else:
        z = zipfile.ZipFile(io.BytesIO(f.content))
        z.extractall()
['2021-07-01', '2021-07-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/4b50b7a5b5b93751038f5e9c81c01583-15e58e503bc230d1081799d92a3f4e8e:getPixels
['2022-04-01', '2022-04-30']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/64c37063d09d93f756ec6c1323af0899-bbe2dc9502b9d03d84269c12130b9e1d:getPixels
['2022-05-01', '2022-05-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/3b8a04b334ad87d8844b4addd0981916-e6ae803c60152d76f28a5221b92af235:getPixels
['2022-06-01', '2022-06-30']
Search returned no results.
['2022-07-01', '2022-07-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/63286024ca8c64674e29af7a3177be8b-005e84aaee1ab4158a3cffa36132d16c:getPixels
['2022-08-01', '2022-08-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/78affa3ec27d67d94133778147fba01f-d26206842276ae8312a526386b3123e5:getPixels
['2022-09-01', '2022-09-30']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/51290dca227d9a7ed222b3f8a47c9b5e-10a39b42d4282a56b603ef3477d8083b:getPixels
['2022-10-01', '2022-10-31']
['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B10', 'B11', 'B12', 'QA10', 'QA20', 'QA60']
https://earthengine.googleapis.com/v1alpha/projects/earthengine-legacy/thumbnails/366e4d4c9292614686a091acea2a537d-115aae561c3ba5e7993ad949816e4e4f:getPixels
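The download loop's `while not check` retry hinges on `zipfile.is_zipfile`, which can be exercised offline with an in-memory archive (a sketch; the file name inside the zip is illustrative):

```python
import io
import zipfile

# build a tiny zip archive in memory, as if returned by requests
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("2022-07_s2.tif", b"fake raster bytes")
zip_bytes = buf.getvalue()

# the same check the loop uses to decide whether to retry the download
is_zip = zipfile.is_zipfile(io.BytesIO(zip_bytes))
not_zip = zipfile.is_zipfile(io.BytesIO(b"not a zip"))
```

If GEE returns an error page instead of the archive, the check fails and the request is issued again.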

Calculate NDWI¶

In [22]:
# get list of all tiff files in the directory
allfiles = [f for f in listdir(quickdir) if isfile(join(quickdir, f))]

# print the number of files in the current directory (quickdir)
print("There are {} files in the aoi_leicester directory".format(len(allfiles)))

# print each file 
print("all files in the aoi_leicester directory are: ")
for f in sorted(allfiles):
    print(f)
There are 7 files in the aoi_leicester directory
all files in the aoi_leicester directory are: 
2021-07_s2.tif
2022-04_s2.tif
2022-05_s2.tif
2022-07_s2.tif
2022-08_s2.tif
2022-09_s2.tif
2022-10_s2.tif
In [23]:
# list to remember the directory paths of the newly
# calculated NDWI files  
ndwifiles = []

# iterate over all Sentinel-2 images and calculate NDWI
for file in allfiles:
  # create an output file name
  # i.e. year - month 
  file_name = file.split("_")[0]
  # build the path that will point to 
  # the newly created NDWI file 
  ndwifile = join(quickdir, file_name + "_ndwi.tif")
  # remember the file name we created
  ndwifiles.append(ndwifile)
  # calculate the NDWI 
  calc_ndwi(file, ndwifile)

print("List of all NDWI image file paths:")
for i in sorted(ndwifiles):
  print(i)
WARNING:rasterio._env:CPLE_AppDefined in 2022-04_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
<ipython-input-7-1c44162413a9>:33: RuntimeWarning: invalid value encountered in true_divide
  ndwi = np.divide(np.subtract(band_nir,band_green), np.add(band_nir, band_green)) # ignore division by zero errors here
WARNING:rasterio._env:CPLE_AppDefined in 2022-05_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2022-07_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2022-10_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2022-08_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2022-09_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
WARNING:rasterio._env:CPLE_AppDefined in 2021-07_s2.tif: TIFFReadDirectory:Sum of Photometric type-related color channels and ExtraSamples doesn't match SamplesPerPixel. Defining non-color channels as ExtraSamples.
List of all NDWI image file paths:
/content/work/aoi_leicester/2021-07_ndwi.tif
/content/work/aoi_leicester/2022-04_ndwi.tif
/content/work/aoi_leicester/2022-05_ndwi.tif
/content/work/aoi_leicester/2022-07_ndwi.tif
/content/work/aoi_leicester/2022-08_ndwi.tif
/content/work/aoi_leicester/2022-09_ndwi.tif
/content/work/aoi_leicester/2022-10_ndwi.tif

Plot Maps¶

In this section, we will warp, clip, and convert files¶

’’’¶

The following blocks of code are modified from: Balzter, H. (2023) practical_week30_Sentinel-2_GEE.ipynb (17 January 2023) https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

  1. Warping is important because we must reproject the images to the same projection as our shapefile; otherwise we will not be able to clip properly.
  2. Converting the files to unsigned 8-bit integers (uint8) lets us make a movie of all the NDWI images.
In [24]:
# create an empty list for the newly created uint8 data file names
uint8files = []

# now warp them all by creating a FOR loop over all files in our list
for f in sorted(ndwifiles):
  # make a file name for our new files
  warpfile = f.split('.')[0]+'_warped.tif'
  uint8file = f.split('.')[0]+'_warped_uint8.tif'
  
  # call the easy_warp function
  pygge.easy_warp(f, warpfile, epsg)
  
  # convert to uint8 data type, clipping pixel values at the 0th and 98th percentiles
  pygge.convert_to_dtype(warpfile, uint8file, np.uint8, percentiles=[0,98])
  
  # add the new file name to the list of file names
  uint8files.append(uint8file)

  # create thumbnails for quality checking
  fig, ax = plt.subplots(1,2, figsize=(5,2.5))
  # ensuring good spacing between plots 
  fig.tight_layout()
  fig.patch.set_facecolor('white')
  # remove the directory path from the title names for the plots 
  title = (warpfile.split("/")[-1]).split("_")[0]

  # plotting 
  pygge.easy_plot(warpfile, ax=ax[0],bands=[1], percentiles=[0,99], cmap = 'Blues',title=title + " warped NDWI")
  pygge.easy_plot(uint8file, ax=ax[1],bands=[1], percentiles=[0,100], cmap = 'Blues', title=title + " uint8 NDWI")
  
# after downloading and warping all image composites, get a list of all warped tiff files in the directory
allfiles = [f for f in listdir(quickdir) if isfile(join(quickdir, f))]
warpfiles = [s for s in allfiles if "_warped.tif" in s]

print("Files after warping:")
pprint(sorted(warpfiles))

print("Files after conversion to uint8 data type:")
pprint(sorted(uint8files))
Creating warped file:/content/work/aoi_leicester/2021-07_ndwi_warped.tif
Scaling from [-0.9963591694831848,-0.12009626969695086] to [0, 255]
Creating warped file:/content/work/aoi_leicester/2022-04_ndwi_warped.tif
Scaling from [-0.9775281548500061,0.0] to [0, 255]
Creating warped file:/content/work/aoi_leicester/2022-05_ndwi_warped.tif
Scaling from [-1.0,-0.04101983398199039] to [0, 255]
Creating warped file:/content/work/aoi_leicester/2022-07_ndwi_warped.tif
Scaling from [-0.7608236074447632,-0.1159757673740387] to [0, 255]
Creating warped file:/content/work/aoi_leicester/2022-08_ndwi_warped.tif
Scaling from [-1.0,-0.08239076614379846] to [0, 255]
Creating warped file:/content/work/aoi_leicester/2022-09_ndwi_warped.tif
Scaling from [-1.0,0.09991604089736938] to [0, 255]
Creating warped file:/content/work/aoi_leicester/2022-10_ndwi_warped.tif
Scaling from [-1.0,-0.011457276325672675] to [0, 255]
Files after warping:
['2021-07_ndwi_warped.tif',
 '2022-04_ndwi_warped.tif',
 '2022-05_ndwi_warped.tif',
 '2022-07_ndwi_warped.tif',
 '2022-08_ndwi_warped.tif',
 '2022-09_ndwi_warped.tif',
 '2022-10_ndwi_warped.tif']
Files after conversion to uint8 data type:
['/content/work/aoi_leicester/2021-07_ndwi_warped_uint8.tif',
 '/content/work/aoi_leicester/2022-04_ndwi_warped_uint8.tif',
 '/content/work/aoi_leicester/2022-05_ndwi_warped_uint8.tif',
 '/content/work/aoi_leicester/2022-07_ndwi_warped_uint8.tif',
 '/content/work/aoi_leicester/2022-08_ndwi_warped_uint8.tif',
 '/content/work/aoi_leicester/2022-09_ndwi_warped_uint8.tif',
 '/content/work/aoi_leicester/2022-10_ndwi_warped_uint8.tif']
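The `Scaling from [...] to [0, 255]` messages show that `pygge.convert_to_dtype` clips each image at the 0th and 98th percentiles before rescaling. A sketch of that kind of percentile stretch in NumPy (an assumption about its internals: linear scaling with NaNs ignored; the real function may differ in detail):

```python
import numpy as np

def stretch_to_uint8(arr, percentiles=(0, 98)):
    """Clip arr at the given percentiles and rescale linearly to 0-255."""
    lo, hi = np.nanpercentile(arr, percentiles)
    scaled = (np.clip(arr, lo, hi) - lo) / (hi - lo) * 255.0
    return scaled.astype(np.uint8)

# NDWI-like values spanning [-1.0, 0.1]
out = stretch_to_uint8(np.linspace(-1.0, 0.1, 101))
```

Clipping at the 98th percentile stops a few bright outliers from compressing the rest of the greyscale range.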

Clipping to Shapefile¶

In [25]:
# list for the newly clipped files 
ndwi_cliped = []


# we don't want to clip the 2021 July file,
# so subtract it from the count for plotting 
nfiles = len(ndwifiles) - 1

# arrange our subplots
cols = min(nfiles, 3) # maximum of 3 plots in one row
rows = math.ceil(nfiles / cols) # round up to nearest integer

# create a figure with subplots
fig, ax = plt.subplots(rows, cols, figsize=(21,7))
fig.patch.set_facecolor('white')

# iterate over all warped raster files
for i, warpfile in enumerate(warpfiles):
  # we don't want to clip 2021 July file right now 
  if "2021" not in warpfile: 
    print(warpfile)

    # make the filename of the new clipped image file
    clipfile = warpfile.split(".")[0] + "_aoi_clip.tif"
    ndwi_cliped.append(clipfile) # remember the clipped file name in our list
    print("Clipped file: ", clipfile)

    # clip it to the shapefile_new extent
    pygge.easy_clip(warpfile, clipfile, extent_new)

# make maps
if len(ndwi_cliped) == 1:
  pygge.easy_plot(clipfile, ax, bands=[1], cmap="Blues", percentiles=[0,100],
                  shapefile=shapefile_new, linecolor="yellow", 
                  title = clipfile.split("/")[-1].split(".")[0], fontsize=8)
else:
  if rows > 1:
    ax = np.reshape(ax, cols*rows)
  for i, clipfile in enumerate(sorted(ndwi_cliped)):
    pygge.easy_plot(clipfile, ax[i], bands=[1], cmap="Blues", percentiles=[0,100],
                  shapefile=shapefile_new, linecolor="yellow", 
                  title = (clipfile.split("/")[-1].split(".")[0]).split("_")[0] + " NDWI", fontsize=8)
    
os.chdir(quickdir)
!ls -l
2022-09_ndwi_warped.tif
Clipped file:  2022-09_ndwi_warped_aoi_clip.tif
Window coordinates:  Window(col_off=0, row_off=0, width=1670, height=2416)
Window array size:  (1, 2416, 1670)
2022-10_ndwi_warped.tif
Clipped file:  2022-10_ndwi_warped_aoi_clip.tif
Window coordinates:  Window(col_off=0, row_off=0, width=1670, height=2416)
Window array size:  (1, 2416, 1670)
2022-04_ndwi_warped.tif
Clipped file:  2022-04_ndwi_warped_aoi_clip.tif
Window coordinates:  Window(col_off=0, row_off=0, width=1670, height=2416)
Window array size:  (1, 2416, 1670)
2022-07_ndwi_warped.tif
Clipped file:  2022-07_ndwi_warped_aoi_clip.tif
Window coordinates:  Window(col_off=0, row_off=0, width=1670, height=2416)
Window array size:  (1, 2416, 1670)
2022-08_ndwi_warped.tif
Clipped file:  2022-08_ndwi_warped_aoi_clip.tif
Window coordinates:  Window(col_off=0, row_off=0, width=1670, height=2416)
Window array size:  (1, 2416, 1670)
2022-05_ndwi_warped.tif
Clipped file:  2022-05_ndwi_warped_aoi_clip.tif
Window coordinates:  Window(col_off=0, row_off=0, width=1670, height=2416)
Window array size:  (1, 2416, 1670)
total 463444
-rw-r--r-- 1 root root 16153742 May 20 14:34 2021-07_ndwi.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2021-07_ndwi_warped.tif
-rw-r--r-- 1 root root  4038710 May 20 14:34 2021-07_ndwi_warped_uint8.tif
-rw-r--r-- 1 root root 17078381 May 20 14:33 2021-07_s2.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-04_ndwi.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-04_ndwi_warped_aoi_clip.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-04_ndwi_warped.tif
-rw-r--r-- 1 root root  4038710 May 20 14:34 2022-04_ndwi_warped_uint8.tif
-rw-r--r-- 1 root root 17174300 May 20 14:33 2022-04_s2.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-05_ndwi.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-05_ndwi_warped_aoi_clip.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-05_ndwi_warped.tif
-rw-r--r-- 1 root root  4038710 May 20 14:34 2022-05_ndwi_warped_uint8.tif
-rw-r--r-- 1 root root 17283736 May 20 14:33 2022-05_s2.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-07_ndwi.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-07_ndwi_warped_aoi_clip.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-07_ndwi_warped.tif
-rw-r--r-- 1 root root  4038710 May 20 14:34 2022-07_ndwi_warped_uint8.tif
-rw-r--r-- 1 root root 18563928 May 20 14:33 2022-07_s2.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-08_ndwi.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-08_ndwi_warped_aoi_clip.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-08_ndwi_warped.tif
-rw-r--r-- 1 root root  4038710 May 20 14:34 2022-08_ndwi_warped_uint8.tif
-rw-r--r-- 1 root root 18339044 May 20 14:33 2022-08_s2.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-09_ndwi.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-09_ndwi_warped_aoi_clip.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-09_ndwi_warped.tif
-rw-r--r-- 1 root root  4038710 May 20 14:34 2022-09_ndwi_warped_uint8.tif
-rw-r--r-- 1 root root 17555236 May 20 14:34 2022-09_s2.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-10_ndwi.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-10_ndwi_warped_aoi_clip.tif
-rw-r--r-- 1 root root 16153742 May 20 14:34 2022-10_ndwi_warped.tif
-rw-r--r-- 1 root root  4038710 May 20 14:34 2022-10_ndwi_warped_uint8.tif
-rw-r--r-- 1 root root 17166234 May 20 14:34 2022-10_s2.tif
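The `Window(col_off=..., width=..., height=...)` lines show that `easy_clip` reads a pixel window covering the shapefile extent. The offset arithmetic behind such a window can be sketched without rasterio for a north-up raster (a simplified model; `window_from_bounds` here is hypothetical, and rasterio's own `windows.from_bounds` should be preferred in practice):

```python
import math

def window_from_bounds(xmin, ymin, xmax, ymax, grid):
    """Approximate pixel window for a bounding box on a north-up raster.
    grid = (origin_x, pixel_width, origin_y, pixel_height); pixel_height
    is negative because rows run north to south."""
    ox, px, oy, py = grid
    col_off = math.floor((xmin - ox) / px)
    row_off = math.floor((ymax - oy) / py)  # top edge of the box
    width = math.ceil((xmax - xmin) / px)
    height = math.ceil((ymax - ymin) / -py)
    return col_off, row_off, width, height

# 10 m pixels with the raster origin at (500000, 5800000)
win = window_from_bounds(500100, 5799000, 500600, 5799500,
                         (500000, 10, 5800000, -10))
```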

Make a Movie of the Images¶

In [26]:
import imageio

# create an empty list where we will collect all raster images
images = []

# iterate over all the sorted converted files
for f in sorted(uint8files):
  # not including 2021 file in movie
  if "2021" not in f:
    print(f)      
    images.append(imageio.imread(f)) # read the next image and append it

# set the duration of each frame in seconds
framerate = { 'duration': 2 }

# save the movie
imageio.mimsave(join(outdir, "movie.gif"), images, **framerate)
/content/work/aoi_leicester/2022-04_ndwi_warped_uint8.tif
/content/work/aoi_leicester/2022-05_ndwi_warped_uint8.tif
/content/work/aoi_leicester/2022-07_ndwi_warped_uint8.tif
/content/work/aoi_leicester/2022-08_ndwi_warped_uint8.tif
/content/work/aoi_leicester/2022-09_ndwi_warped_uint8.tif
/content/work/aoi_leicester/2022-10_ndwi_warped_uint8.tif
<ipython-input-26-9f86eae15216>:11: DeprecationWarning: Starting with ImageIO v3 the behavior of this function will switch to that of iio.v3.imread. To keep the current behavior (and make this warning disappear) use `import imageio.v2 as imageio` or call `imageio.v2.imread` directly.
  images.append(imageio.imread(f)) # read the next image and append it

Pixel Frequency of each NDWI warped file¶

This allows further investigation of the distribution of NDWI values.

In [27]:
# arrange our plots 
fig,ax = plt.subplots(7,1, figsize=(5,35))
plt.style.use('ggplot')

# iterate over sorted warped files 
for i, f in enumerate(sorted(warpfiles)):
    
    # open and read bands of file
    raster = rasterio.open(f)
    bands = raster.read()
    # flatten to plot 
    bands_flat = bands.flatten()

    # year-month for fig title 
    title = f.split("_")[0]

    # plot
    ax[i].hist(bands_flat, bins=256, color="blue")
    ax[i].set_title(title + " Frequency of NDWI Pixel Values")

# show plots 
plt.show()
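The histogram loop flattens every band into one vector before plotting; the same counting can be checked directly with `np.histogram`, which is essentially what `ax.hist` computes (a minimal sketch with synthetic values in the NDWI range):

```python
import numpy as np

# synthetic single-band raster with values in the NDWI range [-1, 1]
rng = np.random.default_rng(0)
bands = rng.uniform(-1.0, 1.0, size=(1, 20, 30))

bands_flat = bands.flatten()  # one value per pixel
counts, edges = np.histogram(bands_flat, bins=256, range=(-1.0, 1.0))
```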

Make a Map showcasing the most severe impacts of the drought¶

To do so, we will take the NDWI difference between July 2022 (the peak drought month) and each other month (e.g. (2022-07 NDWI) - (2021-07 NDWI)). July 2022 captures the drought at its height, so differencing it against each month highlights the deviation from normal conditions, and the lowest values mark the most impacted locations. In the difference maps, the MOST impacted areas appear in the red/yellow colours. (Please note that some of these maps are affected by cloud cover.)
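The subtraction described above reduces to a pixel-wise array operation. A sketch under the assumption that `calc_ndwi_diff` subtracts the comparison month from the July 2022 reference (its exact behaviour is defined earlier in the notebook; `ndwi_difference` is a hypothetical stand-in):

```python
import numpy as np

def ndwi_difference(ndwi_reference, ndwi_month):
    """Pixel-wise NDWI difference between a reference month and another
    month; assumed behaviour of calc_ndwi_diff on raw arrays."""
    return ndwi_reference.astype("float32") - ndwi_month.astype("float32")

# tiny synthetic NDWI grids for the two Julys
july_2022 = np.array([[-0.6, -0.3], [-0.2, 0.1]])
july_2021 = np.array([[-0.4, -0.3], [-0.5, 0.4]])
diff = ndwi_difference(july_2022, july_2021)
```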

In [ ]:
# list of ndwi diff files 
ndwi_diff_files = [] 

# define july 2022 ndwi file first 
for i, file in enumerate(warpfiles):
    if "2022-07" in file:
        july_2022_ndwi = warpfiles[i]

for i, file in enumerate(sorted(warpfiles)):
    # July 2022 is the reference month, so skip it here
    if "2022-07" in file:
        continue
    else:
        # take the basename first, since the directory path contains underscores
        file_title = os.path.basename(file).split("_")[0] + "_diff"
        ndwi_diff_file = join(quickdir, file_title + ".tif")
        ndwi_diff_files.append(ndwi_diff_file)
        calc_ndwi_diff(july_2022_ndwi, file, ndwi_diff_file)
        
        fig, (ax1, ax2, ax3) = plt.subplots(1,3, figsize=(7,9))
        fig.patch.set_facecolor('white')
        fig.tight_layout()

        # plot current month NDWI maps 
        pygge.easy_plot(file, ax=ax1, bands=[1], percentiles=[0,99], cmap='Blues', title = (file_title.split("_")[0] + " NDWI"))
        # plot july 2022 NDWI map
        pygge.easy_plot(july_2022_ndwi, ax=ax2, bands=[1], percentiles=[0,99], cmap='Blues', title = "2022-07 NDWI")
        # plot difference maps 
        # highlighted areas are the MOST impacted 
        pygge.easy_plot(ndwi_diff_file,ax=ax3, bands=[1], percentiles=[0,99],cmap='RdYlBu', title = "NDWI Difference")
In [ ]:
# list of NDWI difference files
for file in ndwi_diff_files:
    print(file)

NDWI Difference Maps Plotted with Zonal Shapefile¶

In [ ]:
# plot the zonal shapefile over our NDWI severity maps 
zonal_shapefile = join(rootdir, 'leicester', 'zonal_stats.shp') # ESRI Shapefile of the study area

# arrange our plots 
fig,ax = plt.subplots(2,3, figsize=(20,20))
ax = np.reshape(ax, 2*3)

# iterate over sorted warped files 
for i, f in enumerate(sorted(ndwi_diff_files)):
    # plot
    title = os.path.basename(f).split("_")[0]
    pygge.easy_plot(f,ax=ax[i], bands=[1], percentiles=[0,99],cmap='RdYlBu', shapefile=zonal_shapefile, linecolor="black", title = title + " NDWI Difference from July 2022")
fig.show()

Monthly Stats¶

Plot time series of mean and median NDWI within our polygon

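`rasterstats.zonal_stats` returns a list with one dict per polygon, e.g. `[{'mean': ..., 'median': ...}]`. A small sketch (with hand-made stats dicts standing in for the real `zonal_stats` calls) of how each month's record is tagged with its date and accumulated into a dataframe:

```python
import pandas as pd

records = []
# stand-in for: zonal_stats(shapefile_new, f, nodata=0, stats="mean median")
for year_month, stats in [("2022-07", {"mean": 0.05, "median": 0.04}),
                          ("2022-04", {"mean": 0.12, "median": 0.10})]:
    # tag the single-polygon record with its year-month, as in the cell below
    stats.update({"year_month": year_month})
    records.append(stats)

# build the dataframe in one go and sort chronologically
df = pd.DataFrame(records).sort_values("year_month")
print(df)
```

Building from a list of records avoids the per-row `DataFrame.append` pattern, which was removed in pandas 2.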
In [ ]:
# plot mean and median NDWI for inside our polygon  
stats = "mean median"

# collect one record of mean, median per monthly warped NDWI file 
records = []

# iterate over each warped file 
for f in warpfiles:
    # uncomment if 2021 should not be tracked here
    # if "2021" not in f:

    # extract the year-month string from the file name 
    date_range = os.path.basename(f).split("_")[0]

    # get zonal statistics for our polygon
    statistics = zonal_stats(shapefile_new, f, nodata=0, stats=stats)

    # keep track of the date_range with the calculated values 
    statistics[0].update({"year_month": date_range})
    records.append(statistics[0])

# dataframe of sorted monthly values
# (built from a list of records; DataFrame.append was removed in pandas 2)
df = pd.DataFrame(records).sort_values('year_month')
print(df)

# line plot of values
ax = plt.gca()
df.plot(kind='line',x='year_month',y='mean',ax=ax, color='red', title = "Time Series of NDWI Values")
df.plot(kind='line',x='year_month',y='median', color='blue', ax=ax)
plt.show()

Zonal Stats¶

’’’¶

The following blocks of code are modified from: Balzter, H. (2023) practical_week33_zonal_statistics.ipynb (17 January 2023) https://uniofleicester-my.sharepoint.com/personal/hb91_leicester_ac_uk/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23%2FGY7709%2D2022%2D23%2Ezip&parent=%2Fpersonal%2Fhb91%5Fleicester%5Fac%5Fuk%2FDocuments%2FDesktop%2FGY7709%5FSatellite%5FData%5FAnalysis%5Fin%5FPython%2F2022%2D23&ga=1 Downloaded 2 April 2023


’’’

Clipping & warping files to shapefile with smaller polygons to extract zonal stats¶

In [ ]:
# make an empty list where we will remember all clipped file names
zonal_clipfiles = [] 
zonal_shapefile = join(rootdir, 'leicester', 'zonal_stats.shp') # ESRI Shapefile of the study area

# not including 2021 July
nfiles = len(ndwifiles) - 1

# arrange our subplots, assuming a 16:9 screen ratio
cols = min(nfiles, 3) # maximum of 3 plots in one row
rows = math.ceil(nfiles / cols) # round up to nearest integer

# create a figure with subplots
fig, ax = plt.subplots(rows, cols, figsize=(21,7))
fig.patch.set_facecolor('white')

# iterate over all warped raster files
for i, warpfile in enumerate(warpfiles):
    # not including July 2021  
    if "2021" not in warpfile:
        # make the filename of the new clipped image file
        zonal_clipfile = warpfile.split(".")[0] + "_zonal_clip.tif"
        zonal_clipfiles.append(zonal_clipfile) # remember the clipped file name in our list

        # clip it to the shapefile extent
        pygge.easy_clip(warpfile, zonal_clipfile, extent)

# make maps
if rows > 1:
    ax = np.reshape(ax, cols*rows)
for i, zonal_clipfile in enumerate(sorted(zonal_clipfiles)):
    pygge.easy_plot(zonal_clipfile, ax[i], bands=[1], cmap="Blues", percentiles=[0,100],
                    shapefile=zonal_shapefile, linecolor="yellow", 
                    title = os.path.basename(zonal_clipfile).split("_")[0] + " Zonal Clipped", fontsize=8)
    
os.chdir(quickdir)
!ls -l

Map of Zonal Study Area¶

In [ ]:
# loading shapefile via geopandas
zonal_study_area = gpd.read_file(zonal_shapefile)
# plot format
fig, ax = plt.subplots(figsize=(10,10))
ax.set_title("Leicestershire, UK")

# plot study area 
# OSM epsg is 3857
zonal_study_area.to_crs(epsg=3857).plot(ax=ax, facecolor='none', edgecolor='red', linewidth=3)
# add OSM basemap
cx.add_basemap(ax, source=cx.providers.OpenStreetMap.Mapnik)

# show plot
plt.show()

Creating Zonal file¶

In [ ]:
# make the name of the statistics output file
statsfile = join(outdir, "zonalstats.csv")

# call the pygge function for processing zonal statistics for multiple raster files
zonalstats = pygge.easy_zonal_stats(zonal_clipfiles, zonal_shapefile, statsfile, nodata=0)

# open the file and print its contents (all lines)
with open(statsfile, "r") as f:
  print(f.read())

# make the filename of the new pickle file for the stats object
pklfile = statsfile.split(".")[0] + ".pkl"

# write object to file
import pickle
with open(pklfile, "wb") as f:
  pickle.dump(zonalstats, f)
print("\nPickled zonal statistics as pandas dataframe in file: " + pklfile + "\n")

Zonal Stats Plots (bar chart and line graph)¶

In [ ]:
# read the dataframe object from file (not necessary, for demonstration only)
df = pickle.load(open(pklfile, 'rb'))
print("\nPlotting statistics from file: ", pklfile)
print(df.head(5))

# get a list of unique scene IDs from the dataframe
scene_ids = sorted(df['scene_id'].unique())

# iterate over all scene IDs in the pandas dataframe
for x in scene_ids:
  # extract the rows for that scene ID from the full dataframe
  b = df.loc[df['scene_id'] == x]

  # iterate over all columns
  for key, values in b.items():
    if key != "scene_id":
      # arrange plots
      fig, (ax1, ax2) = plt.subplots(1,2, figsize=(10,5))
      plt.style.use('ggplot')
      
      # bar chart 
      # matching polygons to respective land cover 
      labels = ["leicester", "agriculture", "urban", "pasture", "water body"]
      ax1.set_xlabel("Land Cover Type")
      ax1.set_ylabel(key)
      ax1.set_title(x.split("_")[0] + " NDWI Values")
      ax1.bar(range(1, 1+len(values)), values, tick_label = labels, color='blue')
      
      # line graph 
      ax2.set_xlabel("Land Cover Type")
      ax2.set_ylabel(key)
      ax2.set_title(x.split("_")[0] + " NDWI Values")
      ax2.plot(range(1, 1+len(values)),values, color='blue')
      ax2.set_xticks([1,2,3,4,5])
      ax2.set_xticklabels(["leicester", "agriculture", "urban", "pasture", "water body"])
      
      # show the plot on screen
      plt.show()

      # save the plot to a jpg file
      filename = pklfile.split(".")[0] + "_" + x.split("_")[0] + "_" + key + ".jpg"
      fig.savefig(filename, format="jpg")

Time Series of Mean Zonal Stats¶

In [ ]:
# read the dataframe object from file (not necessary, for demonstration only)
df = pickle.load(open(pklfile, 'rb'))

# arrange plot
fig, (ax2) = plt.subplots(1,1, figsize=(10,5))
plt.style.use('ggplot')

# line graph 
ax2.set_xlabel("Land Cover Type")
ax2.set_title("Mean NDWI Values")

# get a list of unique scene IDs from the dataframe
scene_ids = sorted(df['scene_id'].unique())

# iterate over all scene IDs in the pandas dataframe
for x in scene_ids:
  # extract the rows for that scene ID from the full dataframe
  b = df.loc[df['scene_id'] == x]

  # iterate over all columns
  for key, values in b.items():
    if key == "mean":

      ax2.plot(range(1, 1+len(values)),values, label = x.split("_")[0])
      ax2.set_ylabel(key)
      
ax2.set_xticks([1,2,3,4,5])
ax2.set_xticklabels(["Leicester", "agriculture", "urban", "pasture", "water body"])

# show the plot on screen
plt.legend()
plt.show()

Time Series of SD Zonal Stats¶

In [ ]:
# read the dataframe object from file (not necessary, for demonstration only)
df = pickle.load(open(pklfile, 'rb'))

# arrange plot
fig, (ax2) = plt.subplots(1,1, figsize=(10,5))
plt.style.use('ggplot')

# line graph 
ax2.set_xlabel("Land Cover Type")
ax2.set_title("Standard Deviation NDWI Values")

# get a list of unique scene IDs from the dataframe
scene_ids = sorted(df['scene_id'].unique())

# iterate over all scene IDs in the pandas dataframe
for x in scene_ids:
  # extract the rows for that scene ID from the full dataframe
  b = df.loc[df['scene_id'] == x]

  # iterate over all columns
  for key, values in b.items():
    if key == "std":

      ax2.plot(range(1, 1+len(values)),values, label = x.split("_")[0])
      ax2.set_ylabel(key)
      
ax2.set_xticks([1,2,3,4,5])
ax2.set_xticklabels(["Leicester", "agriculture", "urban", "pasture", "water body"])

# show the plot on screen
plt.legend()
plt.show()

Code to Convert to HTML¶

In [ ]:
!pip install -U notebook-as-pdf 

!pyppeteer-install 

!jupyter nbconvert /content/drive/MyDrive/GY7709-2022-23/229034649_GY7709_CW1.ipynb --to html